Refinement of Gini-Means Inequalities and Connections with Divergence Measures
Author
Abstract
In 1938, Gini [3] studied a mean with two parameters. Later, many authors studied the properties of this mean. It contains as particular cases famous means such as the harmonic, geometric, and arithmetic means, as well as the power mean of order r and the Lehmer mean. In this paper we consider inequalities arising from the Gini mean and Heron's mean and improve them, based on results recently studied by the author [13].
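As background, the Gini mean referred to here is usually written in the two-parameter form below; the notation E_{r,s} and the listed special cases are standard but are given as an illustrative sketch rather than quoted from the paper.

$$
E_{r,s}(a,b)=
\begin{cases}
\left(\dfrac{a^{r}+b^{r}}{a^{s}+b^{s}}\right)^{1/(r-s)}, & r\neq s,\\[2mm]
\exp\!\left(\dfrac{a^{r}\ln a+b^{r}\ln b}{a^{r}+b^{r}}\right), & r=s,
\end{cases}
\qquad a,b>0.
$$

In particular, $E_{1,0}(a,b)=\frac{a+b}{2}$ (arithmetic), $E_{0,0}(a,b)=\sqrt{ab}$ (geometric), $E_{0,-1}(a,b)=\frac{2ab}{a+b}$ (harmonic), $E_{r,0}(a,b)=\left(\frac{a^{r}+b^{r}}{2}\right)^{1/r}$ (power mean of order $r$), and $E_{r,r-1}(a,b)=\frac{a^{r}+b^{r}}{a^{r-1}+b^{r-1}}$ (Lehmer mean), while Heron's mean is $N(a,b)=\frac{a+\sqrt{ab}+b}{3}$.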
Similar Articles
Inequalities among Differences of Gini Means and Divergence Measures
In 1938, Gini [3] studied a mean with two parameters. Later, many authors studied the properties of this mean. In particular, it contains famous means such as the harmonic, geometric, and arithmetic means. Here we consider a sequence of inequalities arising from particular values of each parameter of Gini's mean. This sequence generates many nonnegative differences. Not all of them are convex. We have ...
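As a small illustration of such nonnegative differences (a sketch, not the paper's actual list), the particular values $E_{0,-1}=H$, $E_{0,0}=G$ and $E_{1,0}=A$ satisfy $H(a,b)\le G(a,b)\le A(a,b)$, so each gap defines a nonnegative difference, for example

$$
D_{AG}(a,b)=A(a,b)-G(a,b)\ge 0,\qquad D_{GH}(a,b)=G(a,b)-H(a,b)\ge 0.
$$

Here $D_{AG}$ is convex in $(a,b)$, since $A$ is linear and $G$ is concave, whereas convexity of the other differences has to be checked case by case, which is the point of the remark that not all of them are convex.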
Refinement Inequalities among Symmetric Divergence Measures
There are three classical divergence measures in the literature on information theory and statistics, namely Jeffreys-Kullback-Leibler's J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence, and Taneja's arithmetic-geometric mean divergence. These bear an interesting relationship with each other and are based on logarithmic expressions. Divergence measures such as the Hellinger discrimination ...
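For reference, the three logarithmic measures mentioned above are commonly defined, for probability distributions $P=(p_1,\ldots,p_n)$ and $Q=(q_1,\ldots,q_n)$, by

$$
J(P\|Q)=\sum_{i=1}^{n}(p_i-q_i)\ln\frac{p_i}{q_i},\qquad
I(P\|Q)=\frac{1}{2}\sum_{i=1}^{n}\left[p_i\ln\frac{2p_i}{p_i+q_i}+q_i\ln\frac{2q_i}{p_i+q_i}\right],
$$
$$
T(P\|Q)=\sum_{i=1}^{n}\frac{p_i+q_i}{2}\,\ln\frac{p_i+q_i}{2\sqrt{p_i q_i}}.
$$

A direct calculation with these definitions gives $J(P\|Q)=4\left[I(P\|Q)+T(P\|Q)\right]$, one concrete instance of the relationship among the three measures; the definitions and the identity are recalled here as standard background, not quoted from the cited abstract.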
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper we consider measures such as the Kullback-Leibler information, the J-divergence, the Hellinger distance, and so on. Properties and results related to the distance between probability d...
Sequences of Inequalities Among New Divergence Measures
Inder Jeet Taneja, Departamento de Matemática, Universidade Federal de Santa Catarina, 88.040-900 Florianópolis, SC, Brazil. E-mail: [email protected], http://www.mtm.ufsc.br/∼taneja. Abstract: There exist three classical divergence measures in the literature on information theory and statistics, namely the Jeffreys-Kullback-Leibler [5, 6] J-divergence and the Sibson-Burbea-Rao [1] Jensen-Shannon ...
Seven Means, Generalized Triangular Discrimination, and Generating Divergence Measures
The Jensen-Shannon, J-divergence, and arithmetic-geometric mean divergences are three classical divergence measures known in the information theory and statistics literature. These three measures bear an interesting inequality relationship with the three non-logarithmic measures known as triangular discrimination, Hellinger's divergence, and symmetric chi-square divergence. However, in 2003, Eve studied ...
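The three non-logarithmic measures named in this abstract have the following standard forms (the notation $\Delta$, $h$, $\Psi$ is assumed here, not taken from the paper):

$$
\Delta(P\|Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^{2}}{p_i+q_i},\qquad
h(P\|Q)=\frac{1}{2}\sum_{i=1}^{n}\left(\sqrt{p_i}-\sqrt{q_i}\right)^{2},\qquad
\Psi(P\|Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^{2}(p_i+q_i)}{p_i q_i},
$$

i.e. the triangular discrimination, Hellinger's discrimination, and the symmetric chi-square divergence, respectively; inequalities of the kind studied in these papers compare suitably scaled versions of these with the logarithmic measures $J$, $I$ and $T$ above.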
Journal: CoRR
Volume: abs/1111.5241
Pages: -
Publication date: 2011